Lead Ab Initio Developer / Programmer
12+ Months
Remote Job (Looking for candidates within DC, MD, VA, or WV only)
PURPOSE:
The Lead Data Engineer is responsible for orchestrating, deploying, maintaining, and scaling cloud OR on-premise infrastructure targeting big data and platform data management (relational and NoSQL, distributed and converged), with emphasis on reliability, automation, and performance. This role will focus on leading the development of solutions and helping transform the company's platforms to deliver data-driven, meaningful insights and value to the company.
ESSENTIAL FUNCTIONS:
20% Leads the team to design, configure, implement, monitor, and manage all aspects of the Data Integration Framework. Defines and develops Data Integration best practices for a data management environment of optimal performance and reliability.
20% Develops and maintains infrastructure systems (e.g., data warehouses, data lakes) including data access APIs. Prepares and manipulates data using Hadoop or equivalent MapReduce platform.
15% Provides detailed guidance and performs work related to Modeling Data Warehouse solutions in the cloud OR on-premise. Understands Dimensional Modeling, De-normalized Data Structures, OLAP, and Data Warehousing concepts.
15% Oversees the delivery of engineering data initiatives and projects. Supports long-term data initiatives as well as ad hoc analysis and ELT/ETL activities. Creates data collection frameworks for structured and unstructured data. Applies data extraction, transformation, and loading techniques in order to connect large data sets from a variety of sources.
15% Enforces the implementation of best practices for data auditing, scalability, reliability, and application performance. Develops and applies data extraction, transformation, and loading techniques to connect large data sets from a variety of sources.
10% Interprets data, analyzes results using statistical techniques, and provides ongoing reports. Executes quantitative analyses that translate data into actionable insights. Provides analytical and data-driven decision-making support for key projects. Designs, manages, and conducts quality control procedures for data sets using data from multiple systems.
5% Improves data delivery engineering job knowledge by attending educational workshops; reviewing professional publications; establishing personal networks; benchmarking state-of-the-art practices; participating in professional societies.
Qualifications
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable
accommodations may be made to enable individuals with disabilities to perform the essential functions.
Education Level: Bachelor's Degree
Required Experience: 8 years of experience leading data engineering and cross-functional teams to implement scalable, fine-tuned ETL/ELT solutions for optimal performance. Experience developing and updating ETL/ELT scripts. Hands-on experience with application development, relational database layout and development, and data modeling.
In lieu of a Bachelor's degree, an additional 4 years of relevant work experience is required on top of the experience listed above.
ETL Design and Development experience using Ab Initio, Expert
Data Integration project experience on the Hadoop Platform, preferably Cloudera, at least one project
Some AWS cloud experience and exposure to MongoDB are a big plus
Knowledge and understanding of at least one programming language (e.g., SQL, NoSQL, Python), Expert
Knowledge and understanding of database design and implementation concepts, Expert
Knowledge and understanding of data exchange formats, Expert
Knowledge and understanding of data movement concepts, Expert
Strong technical, analytical, and problem-solving skills to troubleshoot a variety of problems, Expert